44 research outputs found

    Rethinking Traditional Web Interaction: Theory and Implementation

    Abstract—In recent years, Web sites have evolved into ever more complex distributed applications, but current Web programming tools are not fully adapted to this evolution and force programmers to worry about too many inessential details. We want to define an alternative programming style better fitted to this kind of application. To do so, we propose an analysis of Web interaction that breaks it down into very elementary notions, based on semantic criteria rather than technological ones. This allows us to define a common vernacular language to describe the concepts of current Web programming tools, as well as some new concepts. We propose to use these new concepts to create new frameworks for programming Web applications, resulting in a significant gain of expressiveness. The understanding and separation of these notions also makes it possible to obtain strong static guarantees that can help considerably during the development of complex applications, for example by making the creation of broken links impossible. We show how most of the ideas we propose have been implemented in the Ocsigen Web programming framework. Ocsigen makes it possible to write a client-server Web application as a single program, and the interaction model we propose is fully compatible with this kind of application. Keywords–Typing; Web interaction; Functional Web programming; Continuation
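The "broken links become impossible" guarantee can be pictured with a small sketch (this is not Eliom's actual API — the `service` type and `link` function below are made up for illustration): links are built from first-class service values rather than URL strings, so a link to a nonexistent service cannot even be written, and parameter types are checked statically.

```ocaml
(* Hypothetical illustration, not Eliom's actual API: a service is a
   first-class value pairing a path with an encoder for its parameter. *)
type 'a service = { path : string; encode : 'a -> string }

(* A made-up service taking an int parameter. *)
let counter_service : int service =
  { path = "/counter"; encode = string_of_int }

(* A link can only be built from an existing service value, and the
   argument type must match the service's parameter type: no broken
   links, no ill-typed parameters. *)
let link (s : 'a service) (arg : 'a) : string =
  s.path ^ "?v=" ^ s.encode arg

let () = print_endline (link counter_service 42)
```

A link to an unregistered service is then a reference to an unbound identifier, caught by the type checker rather than discovered by a user at run time.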

    Strong Normalization by Type-Directed Partial Evaluation and Run-Time Code Generation (Preliminary Version)

    We investigate the synergy between type-directed partial evaluation and run-time code generation for the Caml dialect of ML. Type-directed partial evaluation maps simply typed, closed Caml values to a representation of their long beta-eta-normal form. Caml uses a virtual machine and has the capability to load byte code at run time. Representing the long beta-eta-normal forms as byte code gives us the ability to strongly normalize higher-order values (i.e., weak head normal forms in ML), to compile the resulting strong normal forms into byte code, and to load this byte code, all in one go, at run time. We conclude this note with a preview of our current work on scaling up strong normalization by run-time code generation to the Caml module language.
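The core of type-directed partial evaluation can be sketched in a few lines of OCaml (a simplified reconstruction of the standard technique, not the authors' implementation): `reify` turns a semantic value into residual syntax guided by its type, while `reflect` turns residual syntax back into a semantic value, and together they produce long beta-eta-normal forms.

```ocaml
(* Residual (normal-form) syntax. *)
type term = Var of string | Lam of string * term | App of term * term

(* Semantic values: residual syntax at base type, functions at arrow type. *)
type value = Syn of term | Fun of (value -> value)

(* Simple types. *)
type tp = Base | Arrow of tp * tp

let gensym = let n = ref 0 in fun () -> incr n; Printf.sprintf "x%d" !n

(* Type-directed reification and reflection, mutually recursive on the type. *)
let rec reify (t : tp) (v : value) : term =
  match t, v with
  | Base, Syn e -> e
  | Arrow (a, b), Fun f ->
      let x = gensym () in
      Lam (x, reify b (f (reflect a (Var x))))
  | _ -> failwith "type mismatch"
and reflect (t : tp) (e : term) : value =
  match t with
  | Base -> Syn e
  | Arrow (a, b) -> Fun (fun v -> reflect b (App (e, reify a v)))

(* Reifying the semantic identity at type (o -> o) -> o -> o yields its
   eta-long normal form: fun x1 -> fun x2 -> x1 x2. *)
let nf = reify (Arrow (Arrow (Base, Base), Arrow (Base, Base))) (Fun (fun f -> f))
```

The paper's contribution is what happens next: compiling terms such as `nf` to byte code and loading them back into the running Caml session, so that normalization and code generation happen in one go.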

    Normalization by Evaluation for Call-by-Push-Value and Polarized Lambda-Calculus

    We observe that normalization by evaluation for simply-typed lambda-calculus with weak coproducts can be carried out in a weak bi-cartesian closed category of presheaves equipped with a monad that allows us to perform case distinction on neutral terms of sum type. The placement of the monad influences the normal forms we obtain: for instance, placing the monad on coproducts gives us eta-long beta-pi normal forms where pi refers to permutation of case distinctions out of elimination positions. We further observe that placing the monad on every coproduct is rather wasteful, and an optimal placement of the monad can be determined by considering polarized simple types inspired by focalization. Polarization classifies types into positive and negative, and it is sufficient to place the monad at the embedding of positive types into negative ones. We consider two calculi based on polarized types: pure call-by-push-value (CBPV) and polarized lambda-calculus, the natural deduction calculus corresponding to focalized sequent calculus. For these two calculi, we present algorithms for normalization by evaluation. We further discuss different implementations of the monad and their relation to existing normalization proofs for lambda-calculus with sums. Our developments have been partially formalized in the Agda proof assistant.
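The monad for case distinction on neutral terms can be pictured concretely (a hedged sketch, unrelated to the paper's Agda formalization, with neutrals represented as plain strings): computations are decision trees that branch on neutral scrutinees of sum type, and `bind` pushes the continuation into every branch, which is exactly the permutation of case distinctions out of elimination positions.

```ocaml
(* Sketch: the case-distinction monad as decision trees. A computation
   either returns a value or branches on a neutral scrutinee of sum
   type (neutrals are strings here, for illustration only). *)
type 'a tree =
  | Ret of 'a
  | Case of string * 'a tree * 'a tree  (* scrutinee, inl branch, inr branch *)

let return x = Ret x

(* bind grafts the continuation [f] onto every leaf, i.e. it permutes
   the pending computation under all case distinctions. *)
let rec bind m f =
  match m with
  | Ret x -> f x
  | Case (n, l, r) -> Case (n, bind l f, bind r f)

(* Case distinction on a neutral: both branches are explored; the
   boolean records which injection was assumed. *)
let destruct (n : string) : bool tree = Case (n, Ret true, Ret false)

(* A computation inspecting neutral "u": yields Case ("u", Ret 1, Ret 2). *)
let example = bind (destruct "u") (fun is_inl -> return (if is_inl then 1 else 2))
```

The paper's point about placement can be read off this sketch: wrapping every coproduct in `tree` duplicates work, whereas polarization tells us the monad is only needed where positive types embed into negative ones.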


    Taking ethanol quality beyond fuel grade: A review

    Get PDF
    Ethanol production in the United States approached 15 billion gal/year in 2015. Only about 2.5% of this was food-grade alcohol, but this represents a higher-value product than fuel or other uses. The ethanol production process includes corn milling, cooking, saccharification, fermentation, and separation by distillation. Volatile byproducts are produced during the fermentation of starch; these include other alcohols, aldehydes, ketones, fatty acids, and esters. Food-grade ethanol is generally produced by wet milling, in which starch and sugars are separated from the other corn components, resulting in much smaller concentrations of impurities than are obtained from fermentation of dry-milled corn, where cyclic and heterocyclic compounds are produced from lignin in the corn hull. Some of these volatile byproducts are likely to appear in the distillate, and such fermentation byproducts in ethanol can cause unpleasant flavours and affect human health if the ethanol is used for human consumption. There is interest in improving ethanol quality, since human consumption represents a higher-value use. Advanced purification techniques, such as ozone oxidation, currently used for drinking water and municipal wastewater treatment, offer possibilities for adaptation to ethanol quality improvement. The development of analytical techniques has enabled the detection of low-concentration compounds and simple quality assurance of food-grade alcohol. This review covers the most recent ethanol production methods, potential ethanol purification techniques, and analytical techniques. Application of such techniques would aid in the development of simplified food-grade alcohol production.

    Ocsigen : approche fonctionnelle typée de la programmation Web

    National audience. Around the early 2000s, Christian Queinnec, John Hughes, and Paul Graham independently demonstrated an interesting application of the concept of continuation, used in functional programming, to describe the behaviour of the "back" button in a Web application. Using this discovery makes it possible to simplify the implementation of certain behaviours and to avoid common errors. In fact, this first effort of abstraction can be pushed much further. It is indeed possible to invent high-level concepts to model other behaviours common on the Web, related for example to sessions or to other modes of interaction with the user through a Web interface. We first propose to define a new vernacular language to describe Web interaction, relying on semantics rather than on technological constraints. Implementing these abstract concepts directly in Web programming tools brings a substantial gain of expressiveness, simplifying the programming of complex behaviours; this can notably lead to better ergonomics. This abstraction also enables strong static guarantees that improve the robustness of applications. The result is a new style of Web programming that answers perfectly the needs arising from the current evolution of the Web. We show an implementation in OCaml, as a module for the Ocsigen Web server, called Eliom.
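The continuation idea can be sketched in OCaml (a toy model with made-up names — `ask` and `dialogue` are not from any framework): each page's "rest of the interaction" is passed as a function, and re-invoking a previously stored continuation is exactly what pressing Back and resubmitting a form does.

```ocaml
(* Toy model of continuation-based Web interaction. [ask] poses a
   question and receives the rest of the interaction as a continuation
   [k]; here the user's reply is simulated directly. A real server
   would suspend, store [k] keyed by a URL, and resume it on submit —
   resuming an old [k] twice models the Back button. *)
let ask (question : string) (k : string -> string) : string =
  k ("answer to " ^ question)

let dialogue () =
  ask "name" (fun name ->
      ask "colour" (fun colour ->
          name ^ " likes " ^ colour))

let () = print_endline (dialogue ())
```

Because the pending interaction is an ordinary value, replaying it from an earlier point is well-defined instead of being an error-prone special case.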

    Keeping sums under control

    This paper presents a normalization tool for the λ-calculus with sum types, based on the technique of normalization by evaluation, and more precisely on techniques developed by Olivier Danvy for partial evaluation, using control operators. The main characteristic of this work is that it produces a result in a canonical form; that is to say, two βη-equivalent terms will be normalized into (almost) identical terms. This was not the case with the usual algorithm, which could even lead to an explosion of the size of the code. This canonical form is an η-long β-normal form with constraints, which captures the definition of η-long normal form for the λ-calculus without sums and drastically reduces the η-conversion possibilities for sums. We show how this normalizer helped us solve a problem of characterizing type isomorphisms.
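The η-conversions for sums that make canonical forms difficult can be seen in a tiny example (illustrative only, not from the paper): η-expanding a function at sum type produces a syntactically different but βη-equivalent program, and a canonical normal form must identify the two.

```ocaml
(* A binary sum type. *)
type ('a, 'b) sum = Inl of 'a | Inr of 'b

(* A function on a sum type... *)
let f = function Inl x -> x + 1 | Inr y -> 2 * y

(* ...and its eta-expansion at sum type: case-split on the argument,
   then apply [f] to the reconstructed injection. The two programs are
   beta-eta-equivalent but syntactically distinct; without constraints,
   such expansions can be nested and duplicated, exploding code size. *)
let f_eta e =
  match e with
  | Inl x -> f (Inl x)
  | Inr y -> f (Inr y)
```

The constrained η-long β-normal form described in the abstract rules out all but one representative of each such equivalence class.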

    Une étude des sommes fortes : isomorphismes et formes normales

The goal of this thesis is to study the sum and the zero within two principal frameworks: type isomorphisms and the normalization of lambda-terms. Type isomorphisms had already been studied within the framework of the simply typed lambda-calculus with surjective pairing but without sums. To handle the case with sums and zero, I first restricted the study to linear isomorphisms, within the framework of linear logic, which led to a remarkably simple characterization of these isomorphisms, obtained thanks to a syntactic method on proof-nets. The more general framework of intuitionistic logic corresponds to the open problem of characterizing isomorphisms in bi-cartesian closed categories. I contributed to this study by showing that there is no finite axiomatization of these isomorphisms. To achieve this, I used results in number theory concerning Alfred Tarski's so-called "high school algebra" problem. All of this work raised the problem of finding a canonical form to represent lambda-terms, whether to refute the existence of an isomorphism by a case analysis on the form of the term, or to verify their existence in the case of the very complex functions I was led to handle. This analysis led to an "extensional" definition of normal form for the lambda-calculus with sums and zero, obtained by categorical methods using Grothendieck logical relations, thus providing a new advance on the notoriously difficult question of normalization for this calculus. Finally, I obtained an "intensional" version of this result by using normalization by evaluation: by adapting the technique of type-directed partial evaluation, it is possible to produce a result in this new normal form, which considerably reduces both its size and the normalization time in the case of the type isomorphisms considered before.
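A concrete instance of the kind of isomorphism at stake (an illustrative example, not drawn from the thesis): distributivity of product over sum, the type-level counterpart of the arithmetic identity x·(y+z) = x·y + x·z familiar from Tarski's "high school algebra" problem, witnessed by two mutually inverse OCaml functions.

```ocaml
(* A binary sum type. *)
type ('a, 'b) sum = Inl of 'a | Inr of 'b

(* Witnesses of the isomorphism  a * (b + c)  ≅  (a * b) + (a * c). *)
let dist (a, s) =
  match s with Inl b -> Inl (a, b) | Inr c -> Inr (a, c)

let undist = function
  | Inl (a, b) -> (a, Inl b)
  | Inr (a, c) -> (a, Inr c)

(* [dist] and [undist] compose to the identity in both directions,
   which is exactly what it means for the two types to be isomorphic. *)
```

Refuting such an isomorphism, by contrast, requires inspecting the possible normal forms of any candidate witness, which is where the canonical form developed in the thesis comes in.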